Hierarchical Low-Rank Tensors for Multilingual Transfer Parsing
Authors
Abstract
Accurate multilingual transfer parsing typically relies on careful feature engineering. In this paper, we propose a hierarchical tensor-based approach to this task. The approach induces a compact feature representation by combining atomic features, but unlike traditional tensor models it lets us incorporate prior knowledge about desired feature interactions and eliminate invalid feature combinations. To this end, we use a hierarchical structure in which intermediate embeddings capture the desired feature combinations. Algebraically, this hierarchical tensor is equivalent to a sum of traditional tensors with shared components, and it can therefore be trained effectively with standard online algorithms. In both unsupervised and semi-supervised transfer scenarios, our hierarchical tensor consistently improves UAS and LAS over state-of-the-art multilingual transfer parsers and the traditional tensor model across 10 different languages.
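As a rough illustration of the intermediate-embedding idea sketched in the abstract, the following minimal numpy sketch scores a head-modifier arc from low-rank factor projections, with an intermediate head embedding that combines only the feature groups allowed to interact (here, hypothetically, head word and head POS). All dimensions, factor names, and feature groupings are illustrative assumptions rather than the paper's exact model.

```python
import numpy as np

# A minimal sketch, assuming illustrative dimensions and feature groupings
# (head word / head POS / modifier / arc); not the authors' exact model.
rng = np.random.default_rng(0)
rank = 50
d_word, d_pos, d_mod, d_arc = 150, 50, 200, 30

# Low-rank factor matrices, one per atomic feature group.
A_word = rng.normal(size=(rank, d_word))   # projects head-word features
A_pos  = rng.normal(size=(rank, d_pos))    # projects head-POS features
V      = rng.normal(size=(rank, d_mod))    # projects modifier features
W      = rng.normal(size=(rank, d_arc))    # projects arc features (direction, distance, ...)

def head_embedding(phi_word, phi_pos):
    """Intermediate head embedding: an elementwise product of two projections,
    so only head-word x head-POS interactions are allowed at this level."""
    return (A_word @ phi_word) * (A_pos @ phi_pos)

def arc_score(phi_word, phi_pos, phi_mod, phi_arc):
    """Score of a head-modifier arc: the intermediate head embedding interacts
    with the modifier and arc projections, rank component by rank component."""
    e_head = head_embedding(phi_word, phi_pos)
    return float(np.sum(e_head * (V @ phi_mod) * (W @ phi_arc)))

# Example call with random dense stand-ins for (sparse, one-hot) feature vectors.
score = arc_score(rng.normal(size=d_word), rng.normal(size=d_pos),
                  rng.normal(size=d_mod), rng.normal(size=d_arc))
```

Because the score is a sum of products of factor projections, gradients with respect to each factor matrix are straightforward, which is consistent with the abstract's point that the hierarchical tensor can be trained with standard online algorithms.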
Similar resources
An Introduction to Hierarchical (H-) Rank and TT-Rank of Tensors with Examples
We review two similar concepts of hierarchical rank of tensors (which extend the matrix rank to higher-order tensors): the TT-rank and the H-rank (hierarchical or H-Tucker rank). Based on this notion of rank, one can define a data-sparse representation of tensors involving O(dnk + dk^3) data for order-d tensors with mode sizes n and rank k. Simple examples underline the differences and similari...
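To make the storage figure quoted above concrete, here is a small sketch comparing leading-order entry counts for a full order-d array, the hierarchical format's O(dnk + dk^3) data, and the tensor-train format; the counting functions and example sizes are assumptions for illustration only.

```python
# A small sketch with assumed example sizes, comparing leading-order storage
# counts: a full order-d array with mode size n, the hierarchical (H-Tucker)
# format's O(d*n*k + d*k**3) data quoted above, and the tensor-train format.

def full_storage(d, n):
    return n ** d                     # every entry stored explicitly

def ht_storage(d, n, k):
    # d leaf matrices of size n-by-k plus roughly d transfer tensors of size k^3
    return d * n * k + d * k ** 3

def tt_storage(d, n, k):
    # d TT cores of size k-by-n-by-k (boundary cores are smaller; ignored here)
    return d * n * k ** 2

for d, n, k in [(6, 10, 5), (10, 20, 8)]:
    print(f"d={d:2d} n={n:2d} k={k}: full={full_storage(d, n):,} "
          f"H={ht_storage(d, n, k):,} TT={tt_storage(d, n, k):,}")
```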
Low-Rank Tensors for Scoring Dependency Structures
Accurate scoring of syntactic structures such as head-modifier arcs in dependency parsing typically requires rich, high-dimensional feature representations. A small subset of such features is often selected manually. This is problematic when features lack clear linguistic meaning, as in embeddings, or when the information is blended across features. In this paper, we use tensors to map high-dimens...
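The trick this abstract alludes to, scoring an enormous cross-product feature space through a low-rank tensor without ever materializing it, can be verified at toy sizes; the dimensions below are illustrative and not the paper's feature templates.

```python
import numpy as np

# A minimal check that the low-rank score equals the full-tensor score.
rng = np.random.default_rng(1)
r, n_head, n_mod, n_arc = 4, 6, 5, 3
U = rng.normal(size=(r, n_head))
V = rng.normal(size=(r, n_mod))
W = rng.normal(size=(r, n_arc))

# Explicit tensor T = sum_r u_r (x) v_r (x) w_r (only feasible at toy sizes).
T = np.einsum('ri,rj,rk->ijk', U, V, W)

phi_h = rng.normal(size=n_head)   # head feature vector
phi_m = rng.normal(size=n_mod)    # modifier feature vector
phi_a = rng.normal(size=n_arc)    # arc feature vector (e.g. direction, length)

dense_score   = np.einsum('ijk,i,j,k->', T, phi_h, phi_m, phi_a)
lowrank_score = np.sum((U @ phi_h) * (V @ phi_m) * (W @ phi_a))
assert np.isclose(dense_score, lowrank_score)
```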
A Randomized Tensor Train Singular Value Decomposition
The hierarchical SVD provides a quasi-best low rank approximation of high dimensional data in the hierarchical Tucker framework. Similar to the SVD for matrices, it provides a fundamental but expensive tool for tensor computations. In the present work we examine generalizations of randomized matrix decomposition methods to higher order tensors in the framework of the hierarchical tensors repres...
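A generic version of this idea can be sketched as a sequential TT sweep in which each unfolding is compressed with a standard Gaussian range finder instead of a full SVD; this is a simplified, assumption-laden sketch, not the cited paper's specific algorithm or its error analysis.

```python
import numpy as np

def randomized_tt(tensor, rank, oversample=5, rng=None):
    """Decompose a d-way array into TT cores of TT-rank <= rank, compressing
    each unfolding with a Gaussian sketch rather than a full SVD."""
    rng = rng if rng is not None else np.random.default_rng(0)
    dims = tensor.shape
    cores, r_prev, mat = [], 1, tensor
    for k in range(len(dims) - 1):
        mat = mat.reshape(r_prev * dims[k], -1)
        r = min(rank, *mat.shape)
        # Randomized range finder: sketch with a Gaussian matrix, then orthonormalize.
        omega = rng.normal(size=(mat.shape[1], min(r + oversample, mat.shape[1])))
        Q, _ = np.linalg.qr(mat @ omega)
        Q = Q[:, :r]
        cores.append(Q.reshape(r_prev, dims[k], r))
        mat = Q.T @ mat              # remaining factor to explain, shape (r, ...)
        r_prev = r
    cores.append(mat.reshape(r_prev, dims[-1], 1))
    return cores

def tt_reconstruct(cores):
    """Contract TT cores back into a full array (for checking at toy sizes)."""
    out = cores[0]
    for core in cores[1:]:
        out = np.tensordot(out, core, axes=([-1], [0]))
    return out.squeeze(axis=(0, -1))

# Toy check on an exactly low-rank tensor.
rng = np.random.default_rng(0)
A = np.einsum('i,j,k->ijk', rng.normal(size=8), rng.normal(size=9), rng.normal(size=7))
cores = randomized_tt(A, rank=2, rng=rng)
assert np.allclose(tt_reconstruct(cores), A, atol=1e-8)
```

On this toy rank-1 tensor the sweep is exact up to floating-point error; general data of course incurs a truncation error.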
Dynamical Approximation by Hierarchical Tucker and Tensor-Train Tensors
We extend results on the dynamical low-rank approximation for the treatment of time-dependent matrices and tensors (Koch & Lubich, 2007 and 2010) to the recently proposed Hierarchical Tucker tensor format (HT, Hackbusch & Kühn, 2009) and the Tensor Train format (TT, Oseledets, 2011), which are closely related to tensor decomposition methods used in quantum physics and chemistry. In this dynamic...
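For orientation only: in the simplest (matrix) case, dynamical low-rank approximation evolves Y(t) = U S V^T by projecting the increment of the data onto the tangent space of the fixed-rank manifold. The sketch below shows just that projection, with assumed toy sizes; it is not the Hierarchical Tucker or Tensor Train extension the abstract describes.

```python
import numpy as np

def tangent_projection(U, V, dA):
    """Orthogonal projection of a perturbation dA onto the tangent space of the
    fixed-rank manifold at Y = U S V.T (U and V have orthonormal columns)."""
    return U @ (U.T @ dA) + (dA @ V) @ V.T - U @ (U.T @ dA @ V) @ V.T

# Toy usage: a random rank-3 factorization and a small perturbation of the data.
rng = np.random.default_rng(0)
U, _ = np.linalg.qr(rng.normal(size=(20, 3)))   # orthonormal basis of the column space
V, _ = np.linalg.qr(rng.normal(size=(15, 3)))   # orthonormal basis of the row space
dA = 1e-2 * rng.normal(size=(20, 15))
dY = tangent_projection(U, V, dA)
# The projection is idempotent: projecting a tangent vector changes nothing.
assert np.allclose(tangent_projection(U, V, dY), dY)
```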
Journal title:
Volume / Issue:
Pages: -
Publication date: 2015